GRAPHICAL INTERFACE AND METHOD FOR MANAGING THE GRAPHICAL INTERFACE DURING THE TOUCH SELECTION OF A DISPLAYED ELEMENT
Abstract:
The invention proposes a touch interface (1) comprising a screen (6) able to detect the approach and the position of a finger (11) of a user, the interface being configured to display on the screen (6) at least one graphic element (8, C1_0, C2_0, C3_0, C4_0, Ci_0, Cj_0) associated with a touch selection zone and initially surrounding an anchor point (9, B1_0, B2_0, B3_0, B4_0, Bi_0, Bj_0) of the graphic element on the screen. The interface (1) is configured to estimate a trajectory (traj(t)) of a point (Dxyz(t)) of the finger (11) and the point of impact (Pxy(t)) of this trajectory on the screen, and is configured to move the graphic element (C1(t)) towards the point of impact (Pxy(t)) when the distance (Ecart1(t)) between the anchor point (B1_0) and the point of impact (Pxy(t)) becomes lower than a first threshold.

Publication number: FR3028967A1
Application number: FR1461286
Filing date: 2014-11-21
Publication date: 2016-05-27
Inventor: Stephane Regnier
Applicant: Renault SAS
Description:
[0001] Graphic interface and method for managing the graphic interface during the touch selection of a displayed element

The invention relates to touch interfaces, in particular touch interfaces fitted on board motor vehicles, or touch interfaces used to control systems on which a user intervenes while being forced to keep his attention focused on other tasks, for example on the monitoring of a production machine. In such configurations, the user must interact with the touch interface while keeping his or her attention largely available for tasks other than operating the interface, and, if the interface is not very large, the user may have difficulty selecting a menu element. To select such a menu element, the user must apply a finger to a given location of the interface corresponding to the touch selection zone of that element, highlighted on the screen by an icon or, more generally, by a graphic symbol displayed substantially at the location of the touch selection zone. In particular, in a motor vehicle that is moving, the selection gestures of the user, who must constantly monitor the road, may be imprecise.

In order to overcome these disadvantages, some manufacturers of mobile terminals have developed larger screens, or text input systems in which, for example, the letter touched by the finger is graphically magnified. This magnified display, shown at a distance from the icon to be activated and from the finger, is maintained for a short period of time, long enough for the user to read the letter he has entered and to check visually that he has made the entry he wanted. This type of display still requires the entry to be performed, for each new letter, on a restricted and separate area of the screen.

The object of the invention is to propose a human/machine interface system making it possible to reduce the occurrence of errors when entering the elements of a menu, by making it easier for the user to select the desired graphic element without necessarily increasing the size of the screen, and by leaving more margin in the gestures to be performed to make the selection.

To this end, the invention provides a graphical interface or touch interface comprising a screen capable of detecting the approach and the position of a user's finger. The detection preferably takes place within a predefined volume, characterized in particular by a detection threshold distance from the screen; detection therefore covers at least this predefined volume, but could extend to a wider space. The interface is configured to display on the screen at least one graphic element associated with a touch selection zone, surrounding an anchor point of the graphic element on the screen. The interface is configured to estimate a trajectory of a point of the finger and the point of impact of that trajectory on the screen, and is configured to move the graphic element towards the point of impact when the distance between the anchor point and the point of impact becomes less than a first threshold. The direction of movement of the graphic element may be defined by the displacement of a particular point of the graphic element, referred to in the remainder of the description as the anchor point in its initial position, and as the centering point once the element has been moved.
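As a purely illustrative sketch of the behaviour just described (not part of the claimed interface; the sampling scheme, the threshold value and names such as `estimate_impact_point` are assumptions), the displacement decision could look as follows in Python:

```python
import math

FIRST_THRESHOLD = 80.0  # assumed value of the first distance threshold, in pixels

def estimate_impact_point(samples):
    """Extrapolate the last two finger samples (x, y, z) to the screen plane z = 0.

    This is one simple way of estimating the point of impact Pxy(t); the text
    only requires that the trajectory be re-estimated from successive samples.
    """
    (x0, y0, z0), (x1, y1, z1) = samples[-2], samples[-1]
    if z1 >= z0:                      # finger is not approaching the screen
        return None
    s = z1 / (z0 - z1)                # extrapolation parameter down to z = 0
    return (x1 + s * (x1 - x0), y1 + s * (y1 - y0))

def should_move(anchor, impact):
    """True when the anchor-to-impact distance falls below the first threshold
    (a circular zone of influence; the text also allows non-circular ones)."""
    gap = math.hypot(impact[0] - anchor[0], impact[1] - anchor[1])
    return gap < FIRST_THRESHOLD
```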
Here, the term "screen" of the touch interface, or graphical interface, covers three two-dimensional regions of space that can be superimposed on one another, with a possible change-of-coordinates calculation managed by an electronic control unit handling both the detection operations performed by the touch interface and the displays on the screen of the touch interface.

[0002] The first of these three regions is constituted by the display screen itself, used to display the graphic elements drawing the user's attention to the regions of space with which he must interact.

[0003] The second region is a detection unit of the touch interface or graphical interface, associated with a touch-sensitive flat surface superimposed on the display screen, or associated with another spatial detection system making it possible in particular to detect the position of the user's finger in the vicinity of the display screen, that position then being identified in the coordinates specific to the detection interface. The third region is defined by point coordinate values of a virtual reference screen, stored by the electronic control unit and grouped into regions of the virtual screen, expressed in a coordinate system specific to the electronic control unit. These regions of the virtual screen are, for example, defined by surface areas or by sets of boundary lines. It is with respect to these regions that the initial anchor points of the graphic elements are stored, that their subsequent centering points are then calculated, and that the coordinates of the other points of each graphic element are calculated at each instant, before being translated into the coordinate system of the display screen. The position of the finger in the coordinates of the detection interface can, for example, be translated into the coordinate system of the electronic control unit in order to identify its position with respect to the different boundaries, and then to calculate over time the positions of the centering points of the displayed graphic elements, which are in turn translated into positions on the display screen.

[0004] Both the graphic element and the touch selection zone initially surround the anchor point, but may no longer surround this anchor point once they have been moved. The position of the anchor point of the graphic element remains fixed during the display of a given selection menu and throughout the interaction between the finger and the interface, until a contact of the finger on the screen possibly triggers the display of another selection menu. The operator's finger can be replaced, in an equivalent manner, by any element detectable by the interface, for example an elongated object such as a stylus, suitable for allowing the detection of a particular geometrical point by the touch interface. The anchor point of the graphic element is a particular point of the screen associated with that graphic element, preferably contained within the boundaries of the graphic element for a reference display state of the interface, corresponding for example to the display of a particular selection menu in the absence of any interaction in progress with the finger. The graphic element may be a surface or line display pattern, having one or more colors. Preferably, the graphic element defines a contour or a surface visible on the screen and coinciding with the contour or surface of the associated touch selection zone.
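To make the coordinate bookkeeping between these three regions concrete, a minimal sketch is given below; the transform values are invented for illustration, and the only assumption is that the mapping between the detection surface, the virtual reference screen and the display is affine:

```python
import numpy as np

# Hypothetical homogeneous 2D transforms between the three coordinate systems
# described above: detection surface -> virtual reference screen -> display.
DETECTION_TO_VIRTUAL = np.array([[1.02, 0.00, -3.0],
                                 [0.00, 1.02, -1.5],
                                 [0.00, 0.00,  1.0]])
VIRTUAL_TO_DISPLAY = np.array([[0.50, 0.00, 0.0],
                               [0.00, 0.50, 0.0],
                               [0.00, 0.00, 1.0]])

def detection_to_display(point_xy):
    """Map a finger position from detection-unit coordinates to display pixels."""
    p = np.array([point_xy[0], point_xy[1], 1.0])
    q = VIRTUAL_TO_DISPLAY @ DETECTION_TO_VIRTUAL @ p
    return (q[0], q[1])
```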
The anchor point may typically be a geometric barycentre of a surface or of a contour defining the visible boundaries of the graphic element in its display position in the absence of interactions. According to some embodiments, the anchor point of the graphic element may be eccentric with respect to the graphic element, for example offset towards the edge of the screen to which the graphic element is closest, so as to limit the risk of overlap between the moved graphic element and the edge of the screen. According to alternative embodiments, the distance threshold may be the same in all directions around the anchor point. According to another variant embodiment, the distance threshold may be a function of the angular position of the point of impact around the anchor point, so as to define a non-circular boundary within which the detection of the point of impact triggers a displacement of the graphic element. A specific function can then be associated with each graphic element of the interface. The display of the graphic element translated at a given time preferably replaces the initial display of the graphic element and any previous displays of the graphic element. The interface may be configured to, when the first threshold is crossed, calculate a translation vector from the anchor point to a temporary centering point, and to perform a corresponding translation of the display of the graphic element. By temporary centering point is meant a point of the screen on which the graphic element is centered during at least certain phases of interaction of the interface with the finger. Centering is understood here in the broad sense of the term, the centering point being for example a surface barycentre, or a barycentre of certain characteristic points of the graphic element, the weighting coefficients of this barycentre being constant but not necessarily equal from one characteristic point to another. The representation of the graphic element around the temporary centering point can then be a translation, a homothety, or a bidirectional expansion of the initial representation of the graphic element around its anchor point. In a preferred embodiment, neither the anchor point nor the temporary centering point is made visible on the displayed graphic element. The ratio of the homothety, or the ratios of the expansion, are preferably greater than or equal to 1 as soon as the centering point no longer coincides with the initial anchor point. The temporary centering point is located on a line between the anchor point and the point of impact. The first distance threshold may be a variable function of an angular position of the point of impact around the anchor point. In other words, the distance threshold defines a non-circular boundary around the anchor point, this boundary delimiting a domain of influence within which the interface is configured to activate the displacement of the display of the graphic element. The interface may be configured to calculate the position of the temporary centering point as a barycentre between the initial anchor point and the point of impact, the relative distance between the centering point and the point of impact being an increasing function of the distance between the finger and the screen. Barycentre is here understood to mean a barycentre weighted between two points, with weighting coefficients which may be variable functions, for example functions of the distance between the finger and the screen.
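As an illustration of this barycentric centering, the following sketch (the square-root weighting profile is only one assumed choice, consistent with Equation (1) given further below) computes a temporary centering point that slides from the anchor point towards the point of impact as the finger approaches the screen:

```python
def centering_point(anchor, impact, finger_height, h0):
    """Temporary centering point Bi(t) as a barycentre of anchor and impact points.

    The weight given to the impact point grows as the finger approaches the
    screen (finger_height -> 0), so the graphic element ends up centred under
    the finger at the moment of contact.
    """
    if finger_height >= h0:                   # finger outside the detection volume
        return anchor
    w = (1.0 - finger_height / h0) ** 0.5     # weight in [0, 1]
    return (anchor[0] + w * (impact[0] - anchor[0]),
            anchor[1] + w * (impact[1] - anchor[1]))
```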
According to another variant embodiment, the relative distance between the centering point and the point of impact is an increasing function of the distance between the finger and the point of impact. Preferably, this function vanishes when the finger touches the screen. Relative distance, or relative approach distance, here refers to the ratio of the distance remaining between the centering point and the point of impact to the distance between the anchor point and the point of impact. According to a first variant embodiment, this relative approach distance does not depend on the distance between the point of impact and the anchor point. According to another variant embodiment, this relative distance can decrease with the distance between the point of impact and the anchor point, for example if the graphic element is small compared to the distance separating two graphic elements. Thus, the finger can be positioned above the graphic element even when the latter has moved away from the zone of influence, defined further on, of another graphic element. According to another variant embodiment, this relative distance can increase with the distance between the point of impact and the anchor point, for example if the graphic element is of a size comparable to the distance separating two graphic elements. This avoids the "displaced" graphic element encroaching unduly on the zones of influence of neighbouring graphic elements. According to yet another embodiment, this relative approach distance may furthermore, like the first distance threshold, be a variable function of an angular position of the point of impact around the anchor point.

[0005] The interface may be configured to, when the first threshold is crossed, display the translated graphic element while dilating this graphic element, along at least one direction, according to a magnification factor. The expansion may correspond to a bidirectional homothety, but may, in certain embodiments, correspond to an expansion having two different ratios along two perpendicular axes of the screen. The dilation may also correspond to a unidirectional dilation. For example, if the graphic element is near a display edge of the screen, the graphic element may be expanded further, or expanded only in the direction perpendicular to that edge, so as to delay the moment when the graphic element overlaps the edge of the screen if the finger approaches this edge. The magnification factor, i.e. the ratio of the homothety or the highest ratio of the bidirectional expansion, is preferably between 1.1 and 1.5, and more preferably between 1.15 and 1.35. Alternatively, or in addition to the change in size, the graphic element can be highlighted by a change in brightness, contrast, color or filling pattern.

[0006] Advantageously, the interface may be configured to, when the first threshold is crossed, display the graphic element translated to the new position corresponding to the crossing of the threshold, and then, as long as the distance between the anchor point and the point of impact remains less than the first threshold, to periodically calculate a new translation vector taking into account each time an updated point of impact, and to display the element translated by the corresponding vector. According to a first embodiment, the size of the displayed graphic element remains constant as long as the centering point differs from the anchor point.
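A possible implementation of this dilation is sketched below (the rectangle representation and the default factors are assumptions; any other shape description could be scaled in the same way):

```python
def dilate_rect(rect, center, fx=1.25, fy=1.25):
    """Expand a graphic element's bounding rectangle about its centering point.

    rect is (x, y, width, height); fx and fy are the expansion ratios along the
    two screen axes (fx == fy gives a homothety). Default factors of 1.25 sit
    in the 1.1-1.5 range mentioned above.
    """
    x, y, w, h = rect
    cx, cy = center
    return (cx + fx * (x - cx), cy + fy * (y - cy), fx * w, fy * h)
```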
According to another embodiment, the interface may be configured to subsequently reduce the size of the graphic element to an intermediate size between its initial size and its size at the time the first distance threshold was crossed. According to yet other embodiments, the interface may be configured to continue to increase the size of the graphic element once the threshold has been crossed. Preferably, the interface is configured to move the zone taken into account as touch selection zone together with the displayed graphic element.

[0007] The displacement is made according to the same vector connecting the anchor point to the temporary centering point. If the graphic element is displayed dilated by one or more magnification factors, the touch selection zone is dilated with the same magnification factors.

[0008] The interface can be configured to, when the distance between the anchor point and the point of impact becomes greater than a second threshold, return the display of the graphic element to its initial position around the anchor point. The second threshold may be identical to the first threshold. The display of the graphic element then also returns to the original size and aspect (brightness, color, graphic relief effect, etc.) it had before the first distance threshold was crossed. According to an alternative embodiment, the return to the original display is performed only if the first threshold remains exceeded beyond a delay threshold. According to the variant embodiments, from the instant it is triggered, the return to the original display can be carried out without transition, or can be performed by displaying the graphic element at a sequence of positions, and with a series of intermediate sizes, between the initial position and the last position before the return was triggered. The return trajectory can for example follow a straight line, with a gradual decrease in the size of the graphic element between its size when the return is triggered and its initial size outside interactions with the finger. According to another variant embodiment, the size of the graphic element can be reduced to its initial size as soon as the return of the graphic element to its original position begins. According to a particular embodiment, the interface can be configured to display on the touch screen at least a first graphic element associated with a first touch selection zone, a first anchor point and a first first-distance-threshold function, and to display at least a second graphic element associated with a second touch selection zone, a second anchor point and a second first-distance-threshold function. The first and second first-distance-threshold functions define around the first and second anchor points, respectively, a first boundary of a first domain of influence and a second boundary of a second domain of influence. The interface may then be configured to allow, at least at times, selection by finger contact at a contact point in the first touch selection zone, while the first touch selection zone temporarily overlaps the second domain of influence and the contact point is in the second domain of influence.
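The following sketch illustrates this two-threshold behaviour (a simple hysteresis; the state names and circular thresholds are assumptions, and the optional delay before returning is omitted):

```python
import math

def update_state(state, anchor, impact, first_threshold, second_threshold):
    """One update of the element's display state.

    The element switches to 'moved' when the impact point comes within the
    first threshold and back to 'initial' once it leaves the second threshold;
    second_threshold >= first_threshold gives a hysteresis band.
    """
    gap = math.hypot(impact[0] - anchor[0], impact[1] - anchor[1])
    if state == "initial" and gap < first_threshold:
        return "moved"
    if state == "moved" and gap > second_threshold:
        return "initial"
    return state
```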
The invention furthermore proposes a method for managing a touch interface, the interface being able to detect the approach and the position of a user's finger with respect to a screen of the interface, in which: in a first step, at least one graphic element associated with a touch selection zone is displayed on the screen, surrounding an anchor point of the graphic element on the screen and lying inside a same zone of influence; a trajectory of a point of the finger and the point of impact of this trajectory on the screen are repeatedly estimated; when the point of impact enters the zone of influence, the displayed graphic element and the associated touch selection zone are moved towards the point of impact; as long as the point of impact remains in the zone of influence, the graphic element and the associated touch selection zone are displayed at a position which is a function of the updated point of impact, and which is all the closer to the point of impact as the finger approaches the screen. By convention, the distance between the point of impact and the graphic element is measured, for example, as the distance between the point of impact and a centering point obtained by applying the translation vector of the graphic element to the initial centering point. The relationship between the proximity of the finger to the screen and the proximity between the point of impact and the displaced equivalent of the anchor point is not necessarily linear.

[0009] Other objects, features and advantages of the invention will become apparent on reading the following description, given solely by way of non-limiting example, and with reference to the appended drawings, in which: Figure 1 shows a motor vehicle equipped with an interface according to the invention; Figure 2 illustrates a man-machine interface according to the invention; and Figure 3 is a graph characteristic of one of the operating modes of the interface of Figure 2. As shown in Figure 1, a touch interface according to the invention can for example be fitted in a motor vehicle 3 driven by a user 4 who, by moving his finger and touching certain points of a screen of a touch interface 1, is thus able to issue instructions to an electronic control unit 2 for actuating various items of equipment of the vehicle, for example a ventilation system 5 of the vehicle or any other vehicle equipment. The electronic control unit 2 can also send back to the touch interface messages reflecting the operating state of the vehicle, so that the user 4 of the vehicle can take these data into account. Figure 2 illustrates the operating principle of a touch interface 1 according to the invention. The touch interface 1 typically comprises a touch screen 6, delimited by edges 7, and a detection system (not shown) making it possible to detect the position of a finger 11 of a user, in particular of a particular point Dxyz of this finger, and to detect whether or not this finger is in contact with the touch screen. By touch screen is meant here any capture system relying on the movement of a finger and the approach of the finger towards a validation surface. The invention can for example be applied to detection systems that optically project information onto an inert surface and observe the volume neighbouring this surface by means of various optical or infrared sensors, in order to detect the position of a finger and to detect whether or not the finger is in contact with the surface.
The touch screen 6 is typically divided, by means of boundaries 10 designated here F1_4, F1_2, F4_1/F2_1, F2_3 and F1_i, into regions or zones of influence, which are referenced R1, R2, R3, R4, Ri and Rj in Figure 2. Each region corresponds to a selection zone of a menu displayed on the touch screen. The regions likely to give rise to a validation act each display a graphic element, referenced 8 here generally, and more particularly referenced according to the regions C1_0, C2_0, C3_0, C4_0, Ci_0, Cj_0. These graphic elements indexed "_0" correspond to an initial display of the menu on the graphic screen. The graphical interface is configured to detect the movement of the finger 11 and in particular of an end Dxyz of this finger, which at a time t is located at the point Dxyz(t) and at the next instant t+dt is located at a point Dxyz(t+dt). The graphical interface is capable of determining, for example by extrapolation of the successive detected points, a trajectory which is re-estimated at each instant and which is denoted traj(t) in Figure 2 for the trajectory estimated at time t, and traj(t+dt) for the trajectory estimated at time t+dt. Each of these calculated trajectories defines a point Pxy, which is here called the point of impact, although the impact remains primarily theoretical. This point Pxy is the intersection of the trajectory and a contact surface of the screen, which can coincide with the display surface of the screen 6.

[0010] The invention proposes, when the point of impact is sufficiently close to one of the graphic elements, to modify the display of the graphic element and to bring it closer to the point of impact, in order to facilitate the work of the user, who can then validate the corresponding menu option without moving his finger away from its current trajectory. To do this, a virtual anchor point 9 is arbitrarily defined for each graphic element; it may not appear on the display, and it serves both to estimate the distance between the graphic element and the point of impact and to calculate the subsequent displacements of the display of the graphic element.

[0011] In Figure 2, these anchor points are marked respectively by the references B1_0 for the graphic element C1_0, B2_0 for the graphic element C2_0, ..., Bi_0 for the graphic element Ci_0.

[0012] These anchor points may, for convenience, correspond to a surface barycentre of the graphic element, or to a barycentre of an outline of the graphic element. According to an alternative embodiment, they may optionally be located arbitrarily near one of the boundaries of the graphic element.

[0013] To determine whether the display of the graphic element C1_0 of the region R1 must be moved, the distance denoted here Ecart1(t) can be compared either with a constant threshold or with a threshold which depends on the direction of the straight line connecting the anchor point B1_0 and the point of impact Pxy(t). For example, it can be checked whether the point of impact is within the boundaries delimiting the region in which the graphic element in question is located in its initial state in the absence of interaction with the finger. In Figure 2, the distance at an instant t between the point of impact Pxy(t) of the trajectory traj(t) calculated at this instant t and the anchor point B1_0 is denoted Ecart1(t).
As a function of this distance, and also as a function of the distance of the finger from the screen, a displacement denoted here U1(t) is applied to the graphic element C1_0; at a given moment this displacement corresponds to a vector joining the anchor point B1_0 to a temporary centering point B1(t) which occupies, with respect to the displaced graphic element C1(t), the same barycentric position as the anchor point B1_0 initially occupies with respect to the graphic element C1_0 in its initial display configuration. At a later time t+dt, the re-calculated trajectory defines a new point of impact Pxy(t+dt), whose position is used, together with the distance of the finger from the screen, to calculate a new position of the centering point B1(t+dt) of the graphic element C1(t+dt) displayed at that time. In order to improve the perception by the user of the graphic element that is about to be selected, it is possible, as soon as the displacement of the display of the graphic element is activated, to accompany this movement with an expansion of the dimensions of the graphic element on the screen, for example a homothety along both directions or possibly, depending on the available space, a dilation in only one of the directions of the screen. The size of the graphic element can then remain constant as long as the displacement of the graphic display continues to be applied. Depending on the size retained for the graphic element and on the amplitude of the displacement, it may happen that the graphic element overlaps one of the boundaries between regions. For example, in Figure 2, the graphic element C1(t+dt) is about to overlap the boundary F1_2. Each graphic element is associated with a touch selection zone which, when it is touched by the finger of the user, triggers an action corresponding to one of the options of the menu displayed on the screen. Preferably, the touch selection zone coincides with the area occupied by the graphic element. The interface according to the invention can be configured so that, if, following the trajectory performed by the finger, the graphic element and the associated touch selection zone overlap one of the boundaries and the finger comes into contact with the screen at a point of the graphic element displayed at this moment, a validation on the associated touch selection zone is then taken into account, even if the point of contact is at this moment beyond the boundary of the region associated with the graphic element.

[0014] In this way, the input of the user is facilitated, since in some way the effective boundary of the region admitted for selecting a menu element is deformed, to an extent depending on the trajectory of the user's finger, so as to broaden the total admitted selection region by temporarily moving its boundaries. Figure 3 illustrates an example of a graph 20 relating the amplitude of displacement, here denoted U1(t), of a graphic element on an interface according to the invention, to a distance h(t) between a finger of the user and the screen and to a distance Ecart1(t) between the point of impact of the trajectory of the finger and the initial anchor point of the graphic element. The mapped surface 21 is here chosen so as to cancel any displacement of the graphic element when the distance of the finger from the screen exceeds a certain threshold h0, which can typically be the detection threshold distance of the touch interface.
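The border-crossing validation described above amounts to hit-testing the selection zones in their current, possibly displaced, positions. A minimal sketch, assuming rectangular zones (the data layout is an assumption made for illustration):

```python
def hit_test(contact, current_zones):
    """Return the menu option whose current (possibly moved and dilated)
    touch selection zone contains the contact point.

    Because the test uses the displaced zone, a touch is accepted even when
    the contact point lies beyond the static boundary of the region that
    originally contained the graphic element.
    """
    for option, (x, y, w, h) in current_zones.items():
        if x <= contact[0] <= x + w and y <= contact[1] <= y + h:
            return option
    return None
```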
The mapped surface 21 is also chosen so as to cancel any displacement of the graphic element when the point of impact is close to the anchor point, since there is then no longer any need to move the graphic element. Typically, the displacement value U1(t) can be chosen as the product of the distance Ecart1(t) and a function which decreases as the distance of the finger from the screen increases, and which vanishes for a threshold value h0 of this distance. One of the possible forms of function for defining the displacement vector U1(t) is to multiply the distance Ecart1(t) directly by a concave or convex function of the distance h from the finger to the screen. This concave or convex function can be, for example, a power of the complement to 1 of the ratio between the distance h of the finger and the threshold distance h0. If a power 1/2 is chosen for this function, the expression proposed in equation (1), corresponding to the graph of Figure 3, is obtained:

U1(t) = Dist(B1_0, Pxy(t)) × √(1 − h(t)/h0) = Ecart1(t) × √(1 − h(t)/h0)   Equation (1)

The advantage of choosing such a convex function is that it produces a "slowing down" effect on the displacement of the graphic element when the finger is in the immediate vicinity of the screen, which avoids disturbing the user just before the final selection. Another variant of the function U1(t) can be envisaged, in which a power function is also applied to the distance Ecart1(t) between the anchor point and the point of impact, so as to slow the movement of the graphic element when it approaches the boundaries of the region associated with the graphic element considered. The distance from the finger to the screen can be taken as the orthogonal distance h from the finger to the screen, as shown in Figures 2 and 3, but variant embodiments can be envisaged in which the distance from the finger to the screen is taken as the distance between the point Dxyz of the finger closest to the screen and the point of impact Pxy(t) of the trajectory at this moment. Another point of view is to define a relative distance between the centering point B1(t) of the graphic element and the point of impact Pxy(t) as a distance ratio Δ1(t), defined as:

Δ1(t) = (Ecart1(t) − U1(t)) / Ecart1(t)   Equation (2)

This relative distance gives the remaining gap to be travelled by the graphic element so that it is centered on the point of impact. This relative distance decreases when the finger approaches the screen and vanishes when the finger touches the screen. The invention is not limited to the examples of embodiment described and can be declined in many variants. The position of the finger relative to the interface can be detected by any tactile means or any selection means based on positioning the end of a finger. The functions used to calculate the displacement of a graphic element may differ from those cited in the examples. Timers may be introduced at certain stages of the modified display process of the at least one graphic element. Modes for displaying one graphic element transparently relative to another may be provided if two graphic elements must be displayed on interfering zones.
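For reference, a direct transcription of Equations (1) and (2) into code might look as follows (the power parameter defaulting to 1/2 reproduces the square-root profile of Figure 3; other profiles discussed above would use a different exponent, and the edge-case handling is an assumption):

```python
def displacement(ecart, h, h0, power=0.5):
    """Equation (1): magnitude U1(t) of the displacement applied to the element.

    ecart : distance Ecart1(t) between the anchor point and the point of impact
    h     : current finger-to-screen distance h(t)
    h0    : detection threshold distance (no displacement beyond it)
    """
    if h >= h0 or ecart <= 0.0:
        return 0.0
    return ecart * (1.0 - h / h0) ** power

def relative_distance(ecart, u):
    """Equation (2): remaining relative gap between the centering point and the
    point of impact; equals 1 before any displacement and 0 when the finger
    touches the screen."""
    return (ecart - u) / ecart if ecart > 0.0 else 0.0
```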
Claims:
1. Touch interface (1) comprising a display screen (6), the interface being able to detect the approach and the position of a finger (11) of a user relative to the screen, the interface being configured to display on the screen (6) at least one graphic element (8, C1_0, C2_0, C3_0, C4_0, Ci_0, Cj_0) associated with a touch selection zone, surrounding an anchor point (9, B1_0, B2_0, B3_0, B4_0, Bi_0, Bj_0) of the graphic element on the screen, characterized in that the interface (1) is configured to estimate a trajectory (traj(t)) of a point (Dxyz(t)) of the finger (11) and the point of impact (Pxy(t)) of this trajectory on the screen, and is configured to move the graphic element (C1(t)) towards the point of impact (Pxy(t)) when the distance (Ecart1(t)) between the anchor point (B1_0) and the point of impact (Pxy(t)) becomes lower than a first threshold.

2. Touch interface according to claim 1, configured to, when the first threshold is crossed, calculate a translation vector (U1(t)) from the anchor point (B1_0) to a temporary centering point (B1(t)), and to perform a corresponding translation of the display (C1(t)) of the graphic element.

3. Touch interface according to claim 2, wherein the first distance threshold is a variable function of an angular position of the point of impact (Pxy(t)) around the anchor point (B1_0).

4. Touch interface according to one of claims 2 or 3, configured to calculate the position of the temporary centering point (B1(t)) as a barycentre between the initial anchor point (B1_0) and the point of impact (Pxy(t)), the relative distance between the centering point and the point of impact being an increasing function of the distance between the finger and the screen.

5. Touch interface according to one of claims 2 to 4, configured to, when the first threshold is crossed, display the translated graphic element (C1(t)) while expanding this graphic element, in at least one direction, according to a magnification factor.

6. Touch interface according to any one of claims 2 to 5, configured to, when the first threshold is crossed, display the translated graphic element (C1(t)) at the new position corresponding to the crossing of the threshold, then, as long as the distance between the anchor point and the point of impact remains less than the first threshold, to periodically calculate a new translation vector (U1(t)), each time taking into account an updated point of impact (Pxy(t)), and to display the graphic element (C1(t)) translated by the corresponding vector (U1(t)).

7. Touch interface according to any one of the preceding claims, configured to move the touch selection zone together with the displayed graphic element (C1(t)).

8. Touch interface according to any one of the preceding claims, configured to, when the distance (Ecart1(t)) between the anchor point (B1_0) and the point of impact (Pxy(t)) becomes greater than a second threshold, return the display of the graphic element (C1(t)) to its initial position (C1_0) around the anchor point (B1_0).

9.
Touch interface (1) according to claim 7, configured to display on the touch screen (6) at least a first graphic element (C1_0) associated with a first touch selection zone, with a first anchor point (B1_0) and with a first first-distance-threshold function, and to display at least a second graphic element (C2_0) associated with a second touch selection zone, with a second anchor point (B2_0) and with a second first-distance-threshold function, the first and the second first-distance-threshold functions defining around the first (B1_0) and the second (B2_0) anchor points, respectively, a first boundary of a first domain of influence and a second boundary of a second domain of influence, the interface being configured to allow, at least at times, selection by contact of the finger (11) at a point of contact in the first touch selection zone, while the first touch selection zone temporarily overlaps the second domain of influence and the point of contact is in the second domain of influence.

10. Method for managing a touch interface (1) able to detect the approach and the position of a finger (11) of a user relative to a screen (6) of the interface, wherein: in a first step, at least one graphic element (C1_0) associated with a touch selection zone is displayed on the screen (6), surrounding an anchor point (B1_0) of the graphic element (C1_0) on the screen (6) and lying within a same zone of influence; a trajectory (traj(t)) of a point (Dxyz(t)) of the finger and the point of impact (Pxy(t)) of this trajectory (traj(t)) on the screen (6) are repeatedly estimated; when the point of impact (Pxy(t)) enters the zone of influence, the displayed graphic element (C1(t)) and the associated touch selection zone are moved towards the point of impact (Pxy(t)); as long as the point of impact (Pxy(t)) remains in the zone of influence, the graphic element (C1(t)) and the associated touch selection zone are displayed at a position which is a function of the updated point of impact (Pxy(t+dt)), and which is all the closer to the point of impact as the finger (11) approaches the screen (6).
Similar technologies:
Publication number | Publication date | Patent title
FR3028967A1 | 2016-05-27 | GRAPHICAL INTERFACE AND METHOD FOR MANAGING THE GRAPHICAL INTERFACE WHEN TACTILE SELECTING A DISPLAYED ELEMENT
EP3221780A1 | 2017-09-27 | Graphical interface and method for managing said graphical interface during the touch-selection of a displayed element
US10409443B2 | 2019-09-10 | Contextual cursor display based on hand tracking
US8284168B2 | 2012-10-09 | User interface device
KR101488121B1 | 2015-01-29 | Apparatus and method for user input for controlling displayed information
US8508347B2 | 2013-08-13 | Apparatus and method for proximity based input
US8760432B2 | 2014-06-24 | Finger pointing, gesture based human-machine interface for vehicles
US9027153B2 | 2015-05-05 | Operating a computer with a touchscreen
JP2012003764A | 2012-01-05 | Reconfiguration of display part based on face tracking or eye tracking
US20170243389A1 | 2017-08-24 | Device and method for signalling a successful gesture input
EP3079042B1 | 2020-06-24 | Device and method for displaying screen based on event
FR2851347A1 | 2004-08-20 | Man-machine interface device for use in vehicle control panel, has touch screen moved with respect to case by actuator based on displacement patterns controlled by analysis and treatment unit to produce touch differential effects
EP2956846B1 | 2020-03-25 | Method, device and storage medium for navigating in a display screen
US20150033193A1 | 2015-01-29 | Methods for modifying images and related aspects
FR3031717A1 | 2016-07-22 | MOTOR VEHICLE STEERING SYSTEM AND METHOD OF MANUFACTURING THE SAME
EP2731002A1 | 2014-05-14 | Method for securing a command on a viewing device with tactile surface and related system
FR3005173A1 | 2014-10-31 | INTERACTION METHOD IN AN AIRCRAFT COCKPIT BETWEEN A PILOT AND ITS ENVIRONMENT
FR3053488A1 | 2018-01-05 | CONTROL METHOD AND CONTROL INTERFACE FOR MOTOR VEHICLE
US10042445B1 | 2018-08-07 | Adaptive display of user interface elements based on proximity sensing
US8667425B1 | 2014-03-04 | Touch-sensitive device scratch card user interface
EP2936293B1 | 2019-04-10 | Air traffic control operator terminal
FR3030798A1 | 2016-06-24 | METHOD FOR MANAGING AN INPUT DEVICE AND INPUT DEVICE APPLIED TO A MOTOR VEHICLE FOR CARRYING OUT THE METHOD
JP5912177B2 | 2016-04-27 | Operation input device, operation input method, and operation input program
CN107335218B | 2021-02-19 | Game scene moving method and device, storage medium, processor and terminal
EP2936284A1 | 2015-10-28 | Interface module making it possible to detect a gesture
Patent family:
Publication number | Publication date
US20170364243A1 | 2017-12-21
CN107209637A | 2017-09-26
CN107209637B | 2020-07-07
KR102237363B1 | 2021-04-07
WO2016079433A1 | 2016-05-26
EP3221781A1 | 2017-09-27
KR20170086101A | 2017-07-25
FR3028967B1 | 2017-12-15
US10191630B2 | 2019-01-29
Cited documents:
Publication number | Filing date | Publication date | Applicant | Patent title
US20090201246A1 | 2008-02-11 | 2009-08-13 | Apple Inc. | Motion Compensation for Screens
EP2105826A2 | 2008-03-25 | 2009-09-30 | LG Electronics Inc. | Mobile terminal and method of displaying information therein
US20110157040A1 | 2009-12-24 | 2011-06-30 | Sony Corporation | Touchpanel device, and control method and program for the device
US20110285665A1 | 2010-05-18 | 2011-11-24 | Takashi Matsumoto | Input device, input method, program, and recording medium
US20120120002A1 | 2010-11-17 | 2012-05-17 | Sony Corporation | System and method for display proximity based control of a touch screen user interface
US20140028557A1 | 2011-05-16 | 2014-01-30 | Panasonic Corporation | Display device, display control method and display control program, and input device, input assistance method and program
FR3002052A1 | 2013-02-14 | 2014-08-15 | Fogale Nanotech | METHOD AND DEVICE FOR NAVIGATING A DISPLAY SCREEN AND APPARATUS COMPRISING SUCH A NAVIGATION
US8624836B1 | 2008-10-24 | 2014-01-07 | Google Inc. | Gesture-based small device input
US8261211B2 | 2009-10-01 | 2012-09-04 | Microsoft Corporation | Monitoring pointer trajectory and modifying display interface
US20110205151A1 | 2009-12-04 | 2011-08-25 | John David Newton | Methods and Systems for Position Detection
JP5654118B2 | 2011-03-28 | 2015-01-14 | 富士フイルム株式会社 | Touch panel device, display method thereof, and display program
KR20140114913A | 2013-03-14 | 2014-09-30 | 삼성전자주식회사 | Apparatus and Method for operating sensors in user device
JP2014183425A | 2013-03-19 | 2014-09-29 | Sony Corp | Image processing method, image processing device and image processing program
CN103513886A | 2013-04-27 | 2014-01-15 | 展讯通信(上海)有限公司 | Touch control device and target object moving method and device of touch control device
BR112017007976A2 | 2014-10-22 | 2018-01-23 | Telefonaktiebolaget Lm Ericsson | method and device for providing a touch-based user interface
FR3028968B1 | 2014-11-21 | 2016-11-25 | Renault Sa | GRAPHICAL INTERFACE AND METHOD FOR MANAGING THE GRAPHICAL INTERFACE WHEN TACTILE SELECTING A DISPLAYED ELEMENT
GB2551520B | 2016-06-20 | 2018-11-21 | Ge Aviat Systems Ltd | Correction of vibration-induced error for touch screen display in an aircraft
US10732759B2 | 2016-06-30 | 2020-08-04 | Microsoft Technology Licensing, Llc | Pre-touch sensing for mobile interaction
CN109388464B | 2018-09-28 | 2022-03-08 | 广州视源电子科技股份有限公司 | Element control method and device
US10795463B2 | 2018-10-22 | 2020-10-06 | Deere & Company | Machine control using a touchpad
CN109814787B | 2019-01-29 | 2021-04-06 | 广州视源电子科技股份有限公司 | Key information determination method, device, equipment and storage medium
US11132821B1 | 2020-05-26 | 2021-09-28 | Adobe Inc. | Providing graphical user interface tools for efficiently selecting handles in vector artwork on touch-based devices
US20210382602A1 | 2020-06-05 | 2021-12-09 | International Business Machines Corporation | Automatically correcting touchscreen errors
Legal status:
Date | Code | Event | Details
2015-11-19 | PLFP | Fee payment | Year of fee payment: 2
2016-05-27 | PLSC | Publication of the preliminary search report | Effective date: 20160527
2016-11-18 | PLFP | Fee payment | Year of fee payment: 3
2017-11-21 | PLFP | Fee payment | Year of fee payment: 4
2019-11-20 | PLFP | Fee payment | Year of fee payment: 6
2020-11-20 | PLFP | Fee payment | Year of fee payment: 7
2021-11-22 | PLFP | Fee payment | Year of fee payment: 8
Priority:
Application number | Filing date | Patent title
FR1461286A | 2014-11-21 | GRAPHICAL INTERFACE AND METHOD FOR MANAGING THE GRAPHICAL INTERFACE WHEN TACTILE SELECTING A DISPLAYED ELEMENT

Application number | Publication number | Filing date | Patent title
FR1461286A | FR3028967B1 | 2014-11-21 | GRAPHICAL INTERFACE AND METHOD FOR MANAGING THE GRAPHICAL INTERFACE WHEN TACTILE SELECTING A DISPLAYED ELEMENT
US15/528,359 | US10191630B2 | 2015-11-19 | Graphical interface and method for managing said graphical interface during the touch-selection of a displayed element
KR1020177017074A | KR102237363B1 | 2015-11-19 | Graphical interface and method for managing said graphical interface during the touch-selection of a displayed element
EP15805596.2A | EP3221781A1 | 2015-11-19 | Graphical interface and method for managing said graphical interface during the touch-selection of a displayed element
CN201580072446.8A | CN107209637B | 2015-11-19 | Graphical interface and method for managing the same during touch selection of displayed elements
PCT/FR2015/053126 | WO2016079433A1 | 2015-11-19 | Graphical interface and method for managing said graphical interface during the touch-selection of a displayed element